Learning Binary Perceptrons Perfectly Efficiently

Authors
Abstract


Similar Articles

On Learning Perceptrons with Binary Weights

We present an algorithm that PAC learns any perceptron with binary weights and arbitrary threshold under the family of product distributions. The sample complexity of this algorithm is O((n/ε)^4 ln(n/δ)) and its running time increases only linearly with the number of training examples. The algorithm does not try to find a hypothesis that agrees with all of the training examples; rather, it c...
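As a rough illustration of the setting described above (not the paper's algorithm), the sketch below fits a perceptron with weights in {-1, +1} by estimating per-coordinate correlations with the label from i.i.d. samples, which is a natural estimator under a product distribution. The function names fit_binary_perceptron and predict are hypothetical.

```python
# Illustrative sketch: a binary-weight perceptron learned from
# per-coordinate label correlations. Not the paper's PAC algorithm.
import numpy as np

def fit_binary_perceptron(X, y):
    """X: (m, n) array with entries in {-1, +1}; y: (m,) labels in {-1, +1}.
    Returns (w, theta) with w in {-1, +1}^n and a real threshold theta."""
    # The sign of the estimated correlation E[y * x_i] suggests weight i.
    corr = (y[:, None] * X).mean(axis=0)
    w = np.where(corr >= 0, 1, -1)
    # Pick the threshold that minimises empirical error on the sample.
    scores = X @ w
    candidates = np.unique(scores)
    errs = [np.mean(np.sign(scores - t + 1e-12) != y) for t in candidates]
    theta = candidates[int(np.argmin(errs))]
    return w, theta

def predict(X, w, theta):
    return np.where(X @ w >= theta, 1, -1)
```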


Convergence of stochastic learning in perceptrons with binary synapses.

The efficacy of a biological synapse is naturally bounded and, at some resolution, latest at the level of single vesicles, it is discrete. The finite number of synaptic states dramatically reduces the storage capacity of a network when online learning is considered (i.e., the synapses are immediately modified by each pattern): the trace of old memories decays exponentially with the number of new...
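To make the online setting concrete, here is a minimal sketch of a perceptron with bounded, binary synapses trained by stochastic updates: on each classification error, an active synapse switches toward the desired state only with a small probability q, which slows the overwriting of old memories. The rule and the name train_binary_synapses are illustrative assumptions, not the exact model analysed in the paper.

```python
# Illustrative sketch of online learning with binary synapses (w_i in {0, 1}).
import numpy as np

rng = np.random.default_rng(0)

def train_binary_synapses(patterns, labels, q=0.05, theta=None, epochs=20):
    """patterns: (m, n) array in {0, 1}; labels: (m,) array in {-1, +1}."""
    m, n = patterns.shape
    if theta is None:
        theta = 0.5 * patterns.sum(axis=1).mean()  # rough firing threshold
    w = rng.integers(0, 2, size=n)                 # random initial binary synapses
    for _ in range(epochs):
        for x, y in zip(patterns, labels):
            pred = 1 if w @ x >= theta else -1
            if pred != y:
                # Stochastic, bounded update: each active synapse flips toward
                # the desired state independently with probability q.
                flips = (rng.random(n) < q) & (x == 1)
                target = 1 if y == 1 else 0
                w = np.where(flips, target, w)
    return w, theta
```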


Learning Sparse Perceptrons

We introduce a new algorithm designed to learn sparse perceptrons over input representations which include high-order features. Our algorithm, which is based on a hypothesis-boosting method, is able to PAC-learn a relatively natural class of target concepts. Moreover, the algorithm appears to work well in practice: on a set of three problem domains, the algorithm produces classifiers that utili...
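A minimal sketch of the boosting flavour hinted at above, assuming AdaBoost over single-feature "literal" weak learners so that the final weighted vote touches only a few features (a sparse perceptron). The helper names and the choice of weak learner are assumptions, not the paper's construction.

```python
# Illustrative sketch: boosting single-literal weak learners into a sparse
# weighted vote over the features.
import numpy as np

def boost_sparse_perceptron(X, y, rounds=10):
    """X: (m, n) in {-1, +1}; y: (m,) in {-1, +1}.
    Returns a list of (feature index, sign, alpha) weak hypotheses."""
    m, n = X.shape
    d = np.full(m, 1.0 / m)          # example weights
    hypothesis = []
    for _ in range(rounds):
        # Pick the literal (feature and sign) with the lowest weighted error.
        errs = np.array([[np.sum(d[(s * X[:, j]) != y]) for s in (1, -1)]
                         for j in range(n)])
        j, s_idx = np.unravel_index(np.argmin(errs), errs.shape)
        s = 1 if s_idx == 0 else -1
        err = errs[j, s_idx]
        if err >= 0.5:
            break
        alpha = 0.5 * np.log((1 - err) / max(err, 1e-12))
        hypothesis.append((j, s, alpha))
        # Reweight the examples the chosen literal got wrong.
        pred = s * X[:, j]
        d *= np.exp(-alpha * y * pred)
        d /= d.sum()
    return hypothesis

def predict_vote(X, hypothesis):
    score = sum(alpha * s * X[:, j] for j, s, alpha in hypothesis)
    return np.where(score >= 0, 1, -1)
```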


Bounds on the Degree of High Order Binary Perceptrons

High order perceptrons are often used to reduce the size of neural networks. The complexity of the architecture of a usual multilayer network is then turned into the complexity of the functions performed by each high order unit, and in particular into the degree of their polynomials. The main result of this paper provides a bound on the degree of the polynomial of a high order perceptron,...
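For concreteness, a high-order perceptron can be viewed as a linear threshold over all input monomials of degree at most d, so the degree bound discussed above constrains how many such product terms a single unit needs. The helpers monomials and high_order_perceptron below are hypothetical illustrations of this view, not taken from the paper.

```python
# Illustrative sketch: one high-order unit as a sign of a degree-d polynomial.
from itertools import combinations
import numpy as np

def monomials(x, d):
    """All products of at most d coordinates of x (x in {-1, +1}^n or {0, 1}^n)."""
    n = len(x)
    feats = [1.0]                                   # degree-0 (bias) term
    for k in range(1, d + 1):
        feats.extend(np.prod([x[i] for i in idx])
                     for idx in combinations(range(n), k))
    return np.array(feats)

def high_order_perceptron(x, weights, d):
    """Sign of a degree-d polynomial in the inputs; weights has one entry
    per monomial of degree at most d."""
    return 1 if weights @ monomials(x, d) >= 0 else -1
```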



Journal

Journal title: Journal of Computer and System Sciences

Year: 1996

ISSN: 0022-0000

DOI: 10.1006/jcss.1996.0028